
    Modulated 3D cross-correlation light scattering: improving turbid sample characterization

    Accurate characterization using static light scattering (SLS) and dynamic light scattering (DLS) methods mandates the measurement and analysis of singly scattered light. In turbid samples, multiple scattering must therefore be suppressed to obtain meaningful results. One powerful technique for achieving this, known as 3D cross-correlation, uses two simultaneous light scattering experiments performed at the same scattering vector on the same sample volume in order to extract only the single-scattering information common to both. Here we present a significant improvement to this method in which the two scattering experiments are temporally separated by modulating the incident laser beams and gating the detector outputs at frequencies exceeding the timescale of the system dynamics. This robust modulation scheme eliminates cross-talk between the two beam-detector pairs and leads to a four-fold improvement in the cross-correlation intercept. We measure the dynamic and angular-dependent scattering intensity of turbid colloidal suspensions and exploit the improved signal quality of the modulated 3D cross-correlation DLS and SLS techniques. Comment: Review of Scientific Instruments, accepted for publication.
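
    The principle behind the cross-correlation step can be illustrated numerically: correlating the gated intensity traces from the two detectors retains only the fluctuations common to both channels (single scattering), while uncorrelated contributions merely lower the intercept. Below is a minimal sketch of such a cross-correlation estimate, assuming two synthetic intensity traces; the function name and simulated signals are illustrative, not taken from the paper.

```python
import numpy as np

def cross_correlation_g2(i_a, i_b, max_lag):
    """Normalized intensity cross-correlation g2(tau) between two detector traces.

    Only fluctuations common to both channels contribute to the decaying part;
    uncorrelated (multiple-scattering-like) light just lowers the intercept.
    """
    i_a = np.asarray(i_a, dtype=float)
    i_b = np.asarray(i_b, dtype=float)
    mean_a, mean_b = i_a.mean(), i_b.mean()
    g2 = np.empty(max_lag)
    for lag in range(max_lag):
        # <I_a(t) I_b(t + tau)> averaged over t, normalized by <I_a><I_b>
        g2[lag] = np.mean(i_a[: i_a.size - lag] * i_b[lag:]) / (mean_a * mean_b)
    return g2

# Illustrative use: a shared slowly fluctuating signal plus independent noise per detector
rng = np.random.default_rng(0)
common = 1.0 + 0.5 * np.sin(np.cumsum(rng.normal(0.0, 0.05, 100_000)))
trace_a = common + rng.normal(0.0, 0.2, common.size)
trace_b = common + rng.normal(0.0, 0.2, common.size)
g2 = cross_correlation_g2(trace_a, trace_b, max_lag=200)
print("cross-correlation intercept (g2(0) - 1):", g2[0] - 1.0)
```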

    06-07 “The Economics of Inaction on Climate Change: A Sensitivity Analysis”

    Economic models of climate change often take the problem seriously, but paradoxically conclude that the optimal policy is to do almost nothing about it. We explore this paradox as seen in the widely used DICE model. Three aspects of that model, involving the discount rate, the assumed benefits of moderate warming, and the treatment of the latest climate science, are sufficient to explain the timidity of the model's optimal policy recommendation. With modifications to those three points, DICE shows that the optimal policy is a much higher and rapidly rising marginal carbon price; that higher carbon price has a greater effect on physical measures of climate impacts. Our modifications exhibit nonlinear interactions; at least at low discount rates, there is synergy between individual changes to the model. At low discount rates, the inherent uncertainty about future damages looms larger in the analysis, rendering long-run economic modeling less useful. Our analysis highlights the sensitivity of the model to three debatable assumptions; it does not, and could not, lead to a more reliably “optimal” cost of carbon. Cost-effectiveness analysis, focusing on the generally shorter-term cost side of the problem, reduces the economic paradoxes of the long run, and may make a greater contribution than economic optimization modeling.
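
    Much of the sensitivity discussed above comes down to discounting arithmetic: a damage incurred a century from now has a present value that varies by orders of magnitude with the chosen rate. The sketch below uses purely illustrative numbers (it is not the DICE model) just to show that arithmetic.

```python
# Present value of a fixed climate damage incurred 100 years from now,
# under different constant discount rates. Numbers are illustrative only.
def present_value(damage, years, rate):
    return damage / (1.0 + rate) ** years

damage_in_100_years = 1_000.0  # arbitrary units
for rate in (0.001, 0.015, 0.03, 0.055):
    pv = present_value(damage_in_100_years, years=100, rate=rate)
    print(f"discount rate {rate:6.2%}: present value = {pv:8.1f}")
```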

    Measurements of Masses in SUGRA Models at LHC

    This paper presents new measurements in a case study of the minimal SUGRA model with m_0 = 100 GeV, m_1/2 = 300 GeV, A_0 = 0, tan(beta) = 2.1, and mu = +1, based on four-body distributions from three-step decays and on minimum masses in such decays. These measurements allow the masses of supersymmetric particles to be determined without relying on a model. The feasibility of testing slepton universality at the ~0.1% level at high luminosity is discussed. In addition, the effect of enlarging the parameter space of the minimal SUGRA model is discussed. The direct production of left-handed sleptons and the non-observation of additional structure in the dilepton invariant mass distributions are shown to provide additional constraints. Comment: 30 pages, 22 figures.
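
    Model-independent mass determinations of this kind typically rest on kinematic endpoints of invariant-mass distributions in cascade decays. As a hedged illustration, the sketch below evaluates the standard dilepton-edge formula for a two-step decay chi_2^0 -> slepton + l -> chi_1^0 + l + l; the formula is textbook material, and the masses are placeholders rather than values fitted in this paper.

```python
from math import sqrt

def dilepton_edge(m_chi2, m_slepton, m_chi1):
    """Kinematic endpoint of the dilepton invariant mass in the decay
    chi2 -> slepton + l -> chi1 + l + l (lepton masses neglected)."""
    return sqrt((m_chi2**2 - m_slepton**2) * (m_slepton**2 - m_chi1**2)) / m_slepton

# Placeholder sparticle masses in GeV, purely for illustration
print(f"m_ll^max = {dilepton_edge(m_chi2=233.0, m_slepton=157.0, m_chi1=122.0):.1f} GeV")
```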

    Matched Filtering of Numerical Relativity Templates of Spinning Binary Black Holes

    Tremendous progress has been made towards the solution of the binary-black-hole problem in numerical relativity. The waveforms produced by numerical relativity will play a role in gravitational wave detection, either as test-beds for analytic template banks or as template banks themselves. As the parameter space explored by numerical relativity expands, it becomes increasingly important to quantify the effect each parameter has, first on the detection of gravitational waves and then on the parameter estimation of their sources. In light of this, we present a study of equal-mass, spinning binary-black-hole evolutions through matched filtering techniques commonly used in data analysis. We study how the match between two numerical waveforms varies with numerical resolution, the initial angular momentum of the black holes, and the inclination angle between the source and the detector. This study is limited by the fact that the spinning black-hole binaries are oriented axially and the waveforms contain only approximately two and a half orbits before merger. We find that for detection purposes, spinning black holes require the inclusion of the higher harmonics in addition to the dominant mode, a condition that becomes more important as the black-hole spins increase. In addition, we conduct a preliminary investigation of how well a template of fixed spin and inclination angle can detect target templates of arbitrary spin and inclination for the axial case considered here.
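
    The "match" used in such studies is a noise-weighted overlap between two waveforms, maximized over their relative time (and phase) offset. The sketch below computes a simplified version with a flat (white) noise spectrum and time-shift maximization only; a real analysis would weight the inner products by the detector power spectral density and also maximize over phase.

```python
import numpy as np

def match(h1, h2):
    """Overlap of two equal-length real waveforms, maximized over circular time shifts.

    Assumes white noise (flat spectrum); phase maximization is omitted for brevity.
    """
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    H1, H2 = np.fft.fft(h1), np.fft.fft(h2)
    # Circular cross-correlation of h1 and h2 as a function of time shift
    overlap_vs_shift = np.real(np.fft.ifft(H1 * np.conj(H2)))
    return overlap_vs_shift.max() / np.sqrt(np.dot(h1, h1) * np.dot(h2, h2))

# Sanity check: a waveform matched against itself, or a time-shifted copy, gives ~1
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
chirp = np.sin(2.0 * np.pi * (30.0 * t + 40.0 * t**2)) * np.hanning(t.size)
print(match(chirp, chirp))                # ~1.0
print(match(chirp, np.roll(chirp, 500)))  # ~1.0 (shift absorbed by the maximization)
```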

    How Does the Benefit Value of Medicare Compare to the Benefit Value of Typical Large Employer Plans?: A 2012 Update

    Compares the value of benefits for those age 65 and older under Medicare and under two large employer plans typical of those for which premium support could be offered under reform proposals. Examines the share of costs paid by the plan and by individuals.

    Search and planning under incomplete information : a study using Bridge card play

    This thesis investigates problem-solving in domains featuring incomplete information and multiple agents with opposing goals. In particular, we describe Finesse --- a system that forms plans for the problem of declarer play in the game of Bridge. We begin by examining the problem of search. We formalise a best defence model of incomplete information games in which equilibrium point strategies can be identified, and identify two specific problems that can affect algorithms in such domains. In Bridge, we show that the best defence model corresponds to the typical model analysed in expert texts, and examine search algorithms which overcome the problems we have identified. Next, we look at how planning algorithms can be made to cope with the difficulties of such domains. This calls for the development of new techniques for representing uncertainty and actions with disjunctive effects, for coping with an opposition, and for reasoning about compound actions. We tackle these problems with a..
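
    One way to make a best defence model concrete: a candidate declarer line is scored against defenders who are assumed to see the full deal and to choose whichever reply minimizes declarer's result, and the line's overall value is then averaged (or minimized) over the deals consistent with what declarer can see. The sketch below is a toy version of that evaluation loop; the deal representation, defensive options, and payoff table are fabricated placeholders standing in for a real double-dummy evaluation, and this is not code from Finesse.

```python
def best_defence_value(candidate_line, possible_deals, payoff):
    """Score a declarer line under a best-defence model.

    For each deal consistent with declarer's information, the defence is assumed
    to know everything and to pick the option minimizing declarer's payoff; the
    line is then scored by its average over the deals.
    """
    total = 0.0
    for deal in possible_deals:
        total += min(payoff(candidate_line, deal, d) for d in deal["defensive_options"])
    return total / len(possible_deals)

# Toy example: two candidate lines, three possible deals, a fabricated payoff table
deals = [{"id": i, "defensive_options": ["duck", "cover"]} for i in range(3)]
table = {
    ("finesse", 0, "duck"): 1, ("finesse", 0, "cover"): 0,
    ("finesse", 1, "duck"): 1, ("finesse", 1, "cover"): 1,
    ("finesse", 2, "duck"): 0, ("finesse", 2, "cover"): 0,
    ("drop", 0, "duck"): 1, ("drop", 0, "cover"): 1,
    ("drop", 1, "duck"): 0, ("drop", 1, "cover"): 0,
    ("drop", 2, "duck"): 1, ("drop", 2, "cover"): 1,
}
payoff = lambda line, deal, defence: table[(line, deal["id"], defence)]
for line in ("finesse", "drop"):
    print(line, best_defence_value(line, deals, payoff))
```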

    A Systematic Experimental and Computational Investigation of a Class of Contoured Wall Fuel Injectors

    The performance of a particular class of fuel injectors for scramjet engine applications is addressed. The contoured wall injectors were aimed at augmenting mixing through axial vorticity production arising from interaction of the fuel/air interface with an oblique shock. Helium was used to simulate hydrogen fuel and was injected at Mach 1.7 into a Mach 6 airstream. The effects of incoming boundary layer height, injector spacing, and injectant-to-freestream pressure and velocity ratios were investigated. Results from three-dimensional flow field surveys and Navier-Stokes simulations are presented. Performance was judged in terms of mixing, loss generation, and jet penetration. Injector performance was strongly dependent on the displacement effect of the hypersonic boundary layer, which acted to modify the effective wall geometry. The impact of the boundary layer varied with injector array spacing; widely-spaced arrays were more resilient to the detrimental effects of large boundary layers. Strong dependence on injectant-to-freestream pressure ratio was also displayed: pressure ratios near unity were most conducive to loss-effective mixing and strong jet penetration. Effects due to variation in mean shear associated with non-unity velocity ratios were found to be secondary within the small range of values tested.

    Clustering documents with active learning using Wikipedia

    Wikipedia has been applied as a background knowledge base to various text mining problems, but very few attempts have been made to utilize it for document clustering. In this paper we propose to exploit the semantic knowledge in Wikipedia for clustering, enabling the automatic grouping of documents with similar themes. Although clustering is intrinsically unsupervised, recent research has shown that incorporating supervision improves clustering performance, even when limited supervision is provided. The approach presented in this paper applies supervision using active learning. We first utilize Wikipedia to create a concept-based representation of a text document, with each concept associated with a Wikipedia article. We then exploit the semantic relatedness between Wikipedia concepts to find pair-wise instance-level constraints for supervised clustering, guiding clustering towards the direction indicated by the constraints. We test our approach on three standard text document datasets. Empirical results show that our basic document representation strategy yields performance comparable to previous attempts, and that adding constraints improves clustering performance further by up to 20%.
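
    The pipeline just described can be sketched in three steps: represent each document as a vector over Wikipedia-derived concepts, obtain must-link / cannot-link constraints for a few document pairs (in the paper these come from active learning and concept relatedness), and cluster with those constraints enforced. The sketch below uses a simplified COP-KMeans-style assignment; the concept vectors and constraints are toy placeholders, not the paper's actual representation or active-learning strategy.

```python
import numpy as np

def violates(doc, cluster, assignment, must_link, cannot_link):
    """Would assigning `doc` to `cluster` break any pairwise constraint so far?"""
    for a, b in must_link:
        other = b if a == doc else a if b == doc else None
        if other is not None and assignment.get(other, cluster) != cluster:
            return True
    for a, b in cannot_link:
        other = b if a == doc else a if b == doc else None
        if other is not None and assignment.get(other) == cluster:
            return True
    return False

def constrained_kmeans(X, k, must_link, cannot_link, iters=20, seed=0):
    """COP-KMeans-style clustering: standard centroid updates, but each document
    goes to the nearest centroid that respects the constraints (falling back to
    the nearest one if no assignment is constraint-free)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    assignment = {}
    for _ in range(iters):
        assignment = {}
        for i in range(len(X)):
            order = np.argsort(((centroids - X[i]) ** 2).sum(axis=1))
            ok = [c for c in order if not violates(i, c, assignment, must_link, cannot_link)]
            assignment[i] = int(ok[0]) if ok else int(order[0])
        for c in range(k):
            members = [i for i, a in assignment.items() if a == c]
            if members:
                centroids[c] = X[members].mean(axis=0)
    return assignment

# Toy concept-space vectors (rows = documents, columns = Wikipedia concepts)
X = np.array([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 1.0, 0.0],
              [0.0, 0.9, 0.1], [0.0, 0.0, 1.0], [0.1, 0.0, 0.9]])
must_link = [(0, 1), (2, 3)]   # pairs labelled "same theme" by the active learner
cannot_link = [(1, 4)]         # pair labelled "different themes"
print(constrained_kmeans(X, k=3, must_link=must_link, cannot_link=cannot_link))
```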

    Employing Helicity Amplitudes for Resummation

    Many state-of-the-art QCD calculations for multileg processes use helicity amplitudes as their fundamental ingredients. We construct a simple and easy-to-use helicity operator basis in soft-collinear effective theory (SCET), for which the hard Wilson coefficients from matching QCD onto SCET are directly given in terms of color-ordered helicity amplitudes. Using this basis allows one to seamlessly combine fixed-order helicity amplitudes at any order they are known with a resummation of higher-order logarithmic corrections. In particular, the virtual loop amplitudes can be employed in factorization theorems to make predictions for exclusive jet cross sections without the use of numerical subtraction schemes to handle real-virtual infrared cancellations. We also discuss matching onto SCET in renormalization schemes with helicities in 4 and d dimensions. To demonstrate that our helicity operator basis is easy to use, we provide an explicit construction of the operator basis, as well as results for the hard matching coefficients, for pp → H + 0, 1, 2 jets, pp → W/Z/γ + 0, 1, 2 jets, and pp → 2, 3 jets. These operator bases are completely crossing symmetric, so the results can easily be applied to processes with e+e− and e−p collisions. Comment: 41 pages + 20 pages in Appendices, 1 figure, v2: journal version.
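
    The central statement of the abstract can be summarized schematically (the notation here is illustrative and may differ in detail from the paper's conventions): the hard-scattering Lagrangian is expanded in helicity operators, and matching QCD onto SCET fixes their Wilson coefficients in terms of the infrared-finite parts of the corresponding color-ordered helicity amplitudes.

```latex
% Schematic only; indices, normalizations, and the color basis are left implicit.
\mathcal{L}_{\text{hard}}
  = \sum_{n} \sum_{\{\lambda_i\}}
    \vec{C}_{\lambda_1 \cdots \lambda_n} \cdot \vec{O}^{\,\dagger}_{\lambda_1 \cdots \lambda_n} \,,
\qquad
\vec{C}_{\lambda_1 \cdots \lambda_n}(\{p_i\})
  \;\propto\; A_{\text{fin}}\!\left(1^{\lambda_1}, \ldots, n^{\lambda_n}\right).
```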

    Grillparzer's adoption and adaptation of the philosophy and vocabulary of Weimar classicism

    After a summary of German Classicism and of Grillparzer's at times confusing references to it, the main body of the thesis aims to assess Grillparzer's use of the philosophy and vocabulary of Classicism, with particular reference to his ethical, social and political ideas. Grillparzer's earliest work, including Blanka, leans heavily on Goethe and Schiller, but such plagiarism is avoided after 1810. Following the success of Ahnfrau, however, Grillparzer returns to a much more widespread use of Classical themes, motifs and vocabulary, especially in Sappho. Grillparzer's mood in the period 1816-21 was one of introversion and pessimism, and there is an emphasis on the vocabulary of quiet peace and withdrawal in Vließ. These ideals cannot help man out of the disaster and despair which Grillparzer repeatedly depicts in the 1810s and early 1820s, and there is a consequent tendency for the optimistic vocabulary of Classicism to appear incongruous. The more political plays of the 1820s reject the style and vocabulary of Classicism but still retain its central moral ideals. From 1830 onwards, Grillparzer begins to examine more closely those ideals and concepts inherited from Goethe and Schiller, which had been doomed to failure in the pessimistic atmosphere of earlier years. The very validity of such ideals is now appraised, as is their relevance in political situations which Classicism had often neglected to depict. It is recognised that ideals considered as absolutes can only be achieved in isolation from chaotic human reality, and that any attempt to transfer aesthetic ideals to political and moral spheres may be detrimental to humanity rather than advantageous. There is a gradual return to Classical concepts such as moderation, limitation, right, truth, and especially "der Mensch", but these ideals must be standards for, not barriers to, life and humanity.